On Measure Theoretic definitions of Generalized Information Measures and Maximum Entropy Prescriptions
Authors
Abstract
Though Shannon entropy of a probability measure $P$, defined as $-\int_X \frac{dP}{d\mu}\,\ln\frac{dP}{d\mu}\,d\mu$ on a measure space $(X,\mathfrak{M},\mu)$, does not qualify as an information measure (it is not a natural extension of the discrete case), maximum entropy (ME) prescriptions in the measure-theoretic case are consistent with those of the discrete case. In this paper, we study the measure-theoretic definitions of generalized information measures and discuss the ME prescriptions. We present two results in this regard: (i) we prove that, as in the case of classical relative-entropy, the measure-theoretic definitions of the generalized relative-entropies, Rényi and Tsallis, are natural extensions of their respective discrete cases; (ii) we show that the ME prescriptions of measure-theoretic Tsallis entropy are consistent with the discrete case.
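For reference, the quantities named in the abstract can be written out explicitly. The following is a minimal sketch using standard conventions from the literature; the symbols $\alpha$ and $q$ for the order parameters and the particular normalizations are assumptions made here, not notation taken from the paper. A counting-measure reduction supporting the natural-extension claims is sketched after the similar-resources list below.

% A minimal sketch of the standard forms of the information measures
% named in the abstract. Conventions (order parameters, normalization)
% are assumptions, not the paper's own notation.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

Measure-theoretic Shannon entropy on a measure space $(X,\mathfrak{M},\mu)$:
\[
  S(P) \;=\; -\int_X \frac{dP}{d\mu}\,\ln\frac{dP}{d\mu}\,d\mu .
\]

Kullback--Leibler relative entropy of $P$ with respect to $R$
(both probability measures on $(X,\mathfrak{M})$, with $P \ll R$):
\[
  I(P\,\|\,R) \;=\; \int_X \ln\frac{dP}{dR}\,dP .
\]

R\'enyi relative entropy of order $\alpha > 0$, $\alpha \neq 1$:
\[
  I_\alpha(P\,\|\,R)
  \;=\; \frac{1}{\alpha-1}\,\ln\!\int_X
        \left(\frac{dP}{dR}\right)^{\alpha-1} dP .
\]

Tsallis relative entropy of index $q > 0$, $q \neq 1$:
\[
  I_q(P\,\|\,R)
  \;=\; \frac{1}{q-1}\left(\int_X
        \left(\frac{dP}{dR}\right)^{q-1} dP \;-\; 1\right).
\]

Both generalized relative entropies recover $I(P\,\|\,R)$ in the limit
$\alpha \to 1$ (respectively $q \to 1$).

\end{document}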
Similar Resources
On Generalized Measures of Information with Maximum and Minimum Entropy Prescriptions
Kullback-Leibler relative-entropy or KL-entropy of $P$ with respect to $R$, defined as $\int_X \ln\frac{dP}{dR}\,dP$, where $P$ and $R$ are probability measures on a measurable space $(X,\mathfrak{M})$, plays a basic role in the definitions of classical information measures. It overcomes a shortcoming of Shannon entropy, whose discrete-case definition cannot be extended naturally to the non-discrete case. Further, entropy and oth...
On Measure-Theoretic aspects of Nonextensive Entropy Functionals and corresponding Maximum Entropy Prescriptions
Shannon entropy of a probability measure $P$, defined as $-\int_X \frac{dP}{d\mu}\,\ln\frac{dP}{d\mu}\,d\mu$ on a measure space $(X,\mathfrak{M},\mu)$, is not a natural extension from the discrete case. However, maximum entropy (ME) prescriptions of the Shannon entropy functional in the measure-theoretic case are consistent with those for the discrete case. Also, it is well known that Kullback-Leibler relative entropy can be extended natural...
Gelfand-Yaglom-Perez Theorem for Generalized Relative Entropies
The measure-theoretic definition of Kullback-Leibler relative-entropy (KL-entropy) plays a basic role in the definitions of classical information measures. Entropy, mutual information and conditional forms of entropy can be expressed in terms of KL-entropy and hence properties of their measure-theoretic analogs will follow from those of measure-theoretic KL-entropy. These measure-theoretic defi...
Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals
The measure-theoretic definition of Kullback-Leibler relative-entropy (or simply KL-entropy) plays a basic role in defining various classical information measures on general spaces. Entropy, mutual information and conditional forms of entropy can be expressed in terms of KL-entropy and hence properties of their measure-theoretic analogs will follow from those of measure-theoretic KL-entropy. The...
Combinatorial Information Theory: I. Philosophical Basis of Cross-Entropy and Entropy
This study critically analyses the information-theoretic, axiomatic and combinatorial philosophical bases of the entropy and cross-entropy concepts. The combinatorial basis is shown to be the most fundamental (most primitive) of these three bases, since it gives (i) a derivation for the Kullback-Leibler cross-entropy and Shannon entropy functions, as simplified forms of the multinomial distribu...
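The natural-extension claims that recur in the snippets above rest on standard facts that are worth making explicit. The following is a minimal sketch, assuming a countable space with counting measure for the reduction and quoting the usual statement of the Gelfand-Yaglom-Perez theorem; it reflects the standard textbook arguments, not a derivation taken from any of the papers listed.

% A minimal sketch, assuming X countable with counting measure mu, of
% why the relative entropies extend naturally while Shannon entropy
% does not; standard arguments, not taken from the papers above.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

Let $X=\{x_1,x_2,\dots\}$, let $\mu$ be the counting measure, and write
$p_i = P(\{x_i\})$, $r_i = R(\{x_i\})$, so that $\frac{dP}{dR}(x_i) = p_i/r_i$.
The measure-theoretic KL-entropy then collapses to the familiar discrete sum,
\[
  \int_X \ln\frac{dP}{dR}\,dP \;=\; \sum_i p_i \ln\frac{p_i}{r_i},
\]
and the same substitution turns the R\'enyi and Tsallis integrals into
their discrete forms. More generally, the Gelfand--Yaglom--Perez theorem
characterizes KL-entropy as a supremum over finite measurable partitions,
\[
  I(P\,\|\,R)
  \;=\; \sup_{\{E_1,\dots,E_n\}} \;\sum_{k=1}^{n}
        P(E_k)\,\ln\frac{P(E_k)}{R(E_k)},
\]
which is the statement the two GYP entries above extend to the
generalized relative entropies. Shannon entropy behaves differently:
with $\mu$ the Lebesgue measure on $\mathbb{R}$, the quantity
$-\int \frac{dP}{d\mu}\ln\frac{dP}{d\mu}\,d\mu$ is the differential
entropy, which may be negative and is not the limit of the entropies of
ever finer discretizations, so it is not a natural extension of the
discrete case.

\end{document}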
Journal: CoRR
Volume: abs/cs/0601080
Published: 2006